
    Synthesis, physical and chemical properties, and potential applications of graphite fluoride fibers

    Graphite fluoride fibers can be produced by fluorinating pristine or intercalated graphite fibers. The higher the degree of graphitization of the fibers, the higher the temperature needed to reach the same degree of fluorination. Pitch-based fibers were fluorinated to fluorine-to-carbon atom ratios between 0 and 1. The graphite fluoride fibers with a fluorine-to-carbon atom ratio near 1 show extensive visible structural damage. On the other hand, fluorination of fibers pretreated with bromine, or with fluorine and bromine, results in fibers with a fluorine-to-carbon atom ratio of nearly 0.5 and no visible structural damage. The electrical resistivity of the fibers depends on the fluorine-to-carbon atom ratio and ranged from 0.01 to 10^11 ohm-cm. The thermal conductivity of these fibers ranged from 5 to 73 W/m-K, which is much larger than that of glass, the usual filler in epoxy composites. If graphite fluoride fibers are used as a filler in epoxy or PTFE, the resulting composite may combine high thermal conductivity with an electrical resistivity in either the insulator or semiconductor range. The electrically insulating product may provide heat transfer with lower temperature gradients than many current electrical insulators. Potential applications are presented.

    Incorporating Skew into RMS Surface Roughness Probability Distribution

    The standard treatment of RMS surface roughness data is to apply a Gaussian probability distribution. This handling ignores the skew present in the surface and overestimates the most probable RMS value of the surface, the mode. Using experimental data, we confirm that the Gaussian distribution overestimates the mode and that an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution in the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
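    The mode comparison described above can be sketched with SciPy's skew-normal distribution. This is an illustration only: the sample data and the specific asymmetric distribution are assumptions, not the paper's actual measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical right-skewed RMS roughness readings (nm) standing in
# for real interferometer data
data = stats.skewnorm.rvs(4, loc=1.0, scale=0.5, size=500, random_state=rng)

# Gaussian fit: the mode coincides with the fitted mean
mu, sigma = stats.norm.fit(data)
gaussian_mode = mu

# Skew-normal fit: locate the mode as the peak of the fitted density
a, loc, scale = stats.skewnorm.fit(data)
xs = np.linspace(data.min(), data.max(), 2000)
skew_mode = xs[np.argmax(stats.skewnorm.pdf(xs, a, loc=loc, scale=scale))]

# For right-skewed data the Gaussian "mode" (the mean) sits above the
# skew-normal mode, i.e. the Gaussian overestimates the most probable RMS
print(gaussian_mode, skew_mode)
```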

    Effects of sequential treatment with fluorine and bromine on graphite fibers

    Three pitch-based graphite fibers with different degrees of graphitization and one polyacrylonitrile (PAN) based carbon fiber from Amoco Corporation were treated with 1 atm, room-temperature fluorine gas for 90 hrs. Fluorination resulted in higher electrical conductivity for all pitch fibers. Further bromination after ambient-condition defluorination resulted in further increases in electrical conductivity for the less graphitized, less structurally ordered pitch fibers (P-55), which contain about 3% fluorine by weight before bromination. This product remains stable in 200 C air, or in 100% humidity at 60 C. Due to its low cost, this less graphitized fiber may be useful for industrial applications, such as airfoil deicer materials. The same bromination process, however, resulted in conductivity decreases for the fluorine-rich, more graphitized, structurally oriented pitch fibers (P-100 and P-75). These decreases in electrical conductivity were partially reversed by heating the fibers at 185 C in air. Differential scanning calorimetry (DSC) data indicated that the more graphitized fibers (P-100) contained BrF3, whereas the less graphitized fibers (P-55) did not.

    Measuring Skew in Average Surface Roughness as a Function of Surface Preparation

    Characterizing surface roughness is important for predicting optical performance. Better measurement of surface roughness reduces grinding, saving both time and money, and allows the science requirements to be better defined. In this study, various materials are polished from a fine grind to a fine polish. Each sample's RMS surface roughness is measured at 81 locations in a 9x9 square grid using a Zygo white-light interferometer at regular intervals during the polishing process. Each data set is fit with various standard distributions and tested for goodness of fit. We show that the skew in the RMS data changes as a function of polishing time.
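    The fit-and-test procedure above can be sketched as follows. The 9x9 grid of readings is synthetic and the candidate distributions (normal and skew-normal, scored with a one-sample Kolmogorov-Smirnov test) are assumptions; the study's actual distribution list and goodness-of-fit test are not specified here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical 9x9 grid of RMS roughness readings (nm), right-skewed
grid = stats.skewnorm.rvs(5, loc=2.0, scale=0.8, size=(9, 9), random_state=rng)
sample = grid.ravel()  # the 81 site measurements for one polishing interval

# Fit each candidate distribution by maximum likelihood...
fits = {
    "norm": stats.norm.fit(sample),
    "skewnorm": stats.skewnorm.fit(sample),
}
# ...and score each fit with a one-sample KS test (higher p = better agreement)
pvalues = {
    name: stats.kstest(sample, name, args=params).pvalue
    for name, params in fits.items()
}
print(pvalues)
```

Repeating this at each polishing interval and tracking the fitted skew parameter gives the skew-versus-polishing-time curve the abstract describes.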

    Technofixing the Future: Ethical Side Effects of Using AI and Big Data to meet the SDGs

    While the use of smart information systems (SIS, the combination of AI and Big Data) offers great potential for meeting many of the UN’s Sustainable Development Goals (SDGs), it also raises a number of ethical challenges in implementation. Through six empirical case studies, this paper examines potential ethical issues relating to the use of SIS to meet the challenges in six of the SDGs (2, 3, 7, 8, 11, and 12). The paper shows that a simple “technofix”, such as the use of SIS, is often not sufficient and may exacerbate, or create new, issues for the development community using SIS.

    Towards online concept drift detection with feature selection for data stream classification

    Data streams are unbounded, sequential data instances that are generated very rapidly. The storage, querying, and mining of such rapid flows of data are computationally very challenging. Data Stream Mining (DSM) is concerned with mining such data streams in real time using techniques that require only one pass through the data. DSM techniques need to be adaptive to reflect changes in the pattern encoded in the stream (concept drift). The relevance of features for a DSM classification task may change due to concept drift, and this paper describes the first step towards a concept drift detection method with online feature tracking capabilities.
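    A minimal sketch of online concept drift detection, in the spirit of the DDM family of error-rate monitors: it is not the paper's method, just an illustration of the single-pass, adaptive style of processing the abstract describes. The class name and thresholds are assumptions.

```python
import math

class DriftDetector:
    """Sketch of a DDM-style drift detector (illustrative, not the paper's
    algorithm). Tracks the running error rate p and its std s over a
    single pass; signals drift when p + s exceeds the best observed
    (p_min + s_min) by an extra 2 * s_min margin."""

    def __init__(self, warmup=30):
        self.n = 0
        self.errors = 0
        self.p_min = float("inf")
        self.s_min = float("inf")
        self.warmup = warmup  # minimum instances before signalling

    def update(self, error):
        """error: True if the stream classifier mispredicted this instance.
        Returns True when drift is signalled."""
        self.n += 1
        self.errors += int(error)
        p = self.errors / self.n
        s = math.sqrt(p * (1 - p) / self.n)
        if self.n < self.warmup:
            return False
        if p + s < self.p_min + self.s_min:
            self.p_min, self.s_min = p, s  # new best operating point
        return p + s > self.p_min + 3 * self.s_min
```

Feeding the detector a stream whose error rate jumps (a simulated concept drift) triggers the signal shortly after the change point.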

    Fast adaptive real-time classification for data streams with concept drift

    An important application of Big Data analytics is the real-time analysis of streaming data. Streaming data imposes unique challenges on data mining algorithms: concept drift, the need to analyse the data on the fly because the streams are unbounded, and the need for scalable algorithms due to potentially high data throughput. Real-time classification algorithms that are both fast and adaptive to concept drift exist; however, most approaches are not naturally parallel and are thus limited in their scalability. This paper presents work on the Micro-Cluster Nearest Neighbour (MC-NN) classifier. MC-NN is based on an adaptive statistical data summary built from micro-clusters. It is very fast and adaptive to concept drift whilst maintaining the parallel properties of the base KNN classifier. MC-NN is also competitive with existing data stream classifiers in terms of accuracy and speed.
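    The micro-cluster idea can be sketched as below. This is a simplified illustration of the general technique (labelled statistical summaries queried by nearest centroid), not the published MC-NN algorithm; the error-counter policy and class names are assumptions.

```python
import numpy as np

class MicroCluster:
    """Statistical summary of absorbed instances: linear sum, squared
    sum, and count, so the centroid is cheap to maintain online."""

    def __init__(self, x, label):
        self.ls = np.asarray(x, float).copy()  # linear sum
        self.ss = self.ls ** 2                 # sum of squares
        self.n = 1
        self.label = label
        self.errors = 0                        # misclassification counter

    def centroid(self):
        return self.ls / self.n

    def add(self, x):
        x = np.asarray(x, float)
        self.ls += x
        self.ss += x ** 2
        self.n += 1

class MCNNSketch:
    """Classify by nearest micro-cluster centroid; absorb correctly
    classified instances, spawn a new cluster on errors, and retire
    clusters whose error counters grow too large (a crude form of
    adaptation to concept drift)."""

    def __init__(self, max_errors=3):
        self.clusters = []
        self.max_errors = max_errors

    def _nearest(self, x):
        if not self.clusters:
            return None
        x = np.asarray(x, float)
        return min(self.clusters,
                   key=lambda c: np.linalg.norm(c.centroid() - x))

    def predict(self, x):
        nearest = self._nearest(x)
        return nearest.label if nearest else None

    def learn(self, x, label):
        nearest = self._nearest(x)
        if nearest is None or nearest.label != label:
            # misclassified: create a summary for the true concept
            self.clusters.append(MicroCluster(x, label))
            if nearest is not None:
                nearest.errors += 1
                if nearest.errors > self.max_errors:
                    self.clusters.remove(nearest)  # retire a poor summary
        else:
            nearest.add(x)
```

Because instances update only their nearest micro-cluster, the summaries partition naturally across workers, which is the parallelism argument the abstract makes.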

    Quality Assurance Based Healthcare Information System Design

    Despite decades of research, health information systems have been characterised by cost over-runs, poor specifications, and lack of user uptake. We propose an alternative approach to their design. By viewing health care as a process, and quality as continuously seeking iterative improvements to processes, an object-oriented analysis reveals a class model that supports quality assurance (QA). At the heart of the model is the ability to store actions for comparison with intentions. Measuring the proportion of planned tasks that are executed provides a basis for identifying when to alter a process. We show that the model can represent medical and administrative procedures and argue that it forms an electronic record suitable for health care organisations. Were this record to become a standard, software could be developed close to the point of use, in harmony with the needs of stakeholders, so avoiding many criticisms of health information systems.
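    The core of the class model, storing actions alongside intentions and computing the executed-to-planned ratio, can be sketched as follows. The class and method names are illustrative, not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class CarePlan:
    """Illustrative record pairing intentions with actions so that
    compliance can be measured and fed back into QA."""
    planned: set = field(default_factory=set)   # intended tasks
    executed: set = field(default_factory=set)  # tasks actually performed

    def plan(self, task):
        self.planned.add(task)

    def record(self, task):
        self.executed.add(task)

    def compliance(self):
        """Proportion of planned tasks that were executed."""
        if not self.planned:
            return 1.0
        return len(self.planned & self.executed) / len(self.planned)

    def needs_review(self, threshold=0.8):
        """Flag the underlying process for QA review when the measured
        compliance falls below a chosen threshold (0.8 is assumed)."""
        return self.compliance() < threshold
```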

    On Quality and Communication: The Relevance of Critical Theory to Health Informatics

    Health information systems require long-term investment before they provide a socio-economic return, yet their implementation remains problematic, possibly because the claims made about them appear not to sit well with healthcare professionals’ practice. Health informatics should address these issues from a sound conceptual base, such as might be provided by critical theory, which seeks to identify hidden assumptions and ideologies. This discipline can provide a better understanding of the inner workings of socio-technical systems, with a view to improving them through the promotion of emancipation (allowing people to fulfill their potential). Critical theory can also shed light on the problems with health information systems and offer insight into remedies, for example by relating Habermas’ theories about communication to feedback, a concept central to quality assurance (QA). Such analysis finds that QA’s principal practices can be interpreted as emancipatory, but they require organizations to substantially change their behavior. An alternative approach is to install health information systems designed to support QA. Applying critical theory to these systems shows that they could become an active part of service delivery rather than static repositories of data, because they may encourage standardized conversations between all stakeholders about the important features of health care. Success will depend on access for all participants to data entry and analysis tools, integration with work practice, and use by staff and management in QA. These ideas offer new directions for research into, and the development of, health information systems. The next step will be to implement them and observe their technical and emancipatory properties.

    HabEx Telescope WFE Stability Specification Derived from Coronagraph Starlight Leakage

    HabEx is a space-based 4-meter diameter telescope with ultraviolet (UV), optical, and near-infrared (near-IR) imaging and spectroscopy capabilities. It has three driving science goals for its five-year primary mission: (1) to seek out nearby worlds and explore their habitability; (2) to map out nearby planetary systems and understand the diversity of the worlds they contain; and (3) to carry out observations that open up new windows on the universe from the UV through the near-IR.